Efficient derivative-free Bayesian inference for large-scale inverse problems
Abstract
We consider Bayesian inference for large-scale inverse problems, where computational challenges arise from the need for repeated evaluations of an expensive forward model. This renders most Markov chain Monte Carlo approaches infeasible, since they typically require O(10^4) model runs, or more. Moreover, the forward model is often given as a black box and is impractical to differentiate, so derivative-free algorithms are highly desirable. We propose a framework, built on Kalman methodology, to efficiently perform Bayesian inference in such problems. The basic method rests on an approximation of the filtering distribution of a novel mean-field dynamical system, into which the inverse problem is embedded via the observation operator. Theoretical properties are established for linear problems, demonstrating that the desired posterior is characterized by the steady state of the law of this system, and proving exponential convergence to it. This suggests that, for nonlinear problems that are close to Gaussian, sequentially computing this law provides the basis for efficient iterative methods to approximate the posterior. Ensemble methods are applied to obtain interacting particle system approximations of the mean-field model; practical strategies to further reduce the computational and memory cost of the methodology are presented, including a low-rank approximation and a bi-fidelity approach. The effectiveness of the framework is demonstrated in several numerical experiments: proof-of-concept linear/nonlinear examples and two large-scale applications, namely learning permeability parameters in subsurface flow and learning subgrid-scale parameters in a global climate model. Within the framework, the stochastic ensemble Kalman filter and various ensemble square-root filters are all employed and compared numerically. The results demonstrate that the proposed method is competitive with pre-existing Kalman-based approaches.
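The mean-field construction described in the abstract is specific to the paper, but the derivative-free ingredient it builds on, a Kalman-type update computed from ensemble statistics rather than derivatives of the forward model, can be sketched generically. The following Python sketch shows a plain stochastic ensemble Kalman inversion loop for data y = G(u) + noise; the function names (`eki_update`, `forward_model`), the fixed iteration count, and the sample-covariance normalization are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def eki_update(ensemble, forward_model, y, noise_cov, n_iters=20, seed=0):
    """Generic stochastic ensemble Kalman inversion loop (illustrative sketch).

    ensemble      : (J, d) array of parameter particles u_j
    forward_model : callable mapping a (d,) parameter vector to a (k,) prediction
    y             : (k,) observed data
    noise_cov     : (k, k) observation-noise covariance
    """
    rng = np.random.default_rng(seed)
    J = ensemble.shape[0]
    for _ in range(n_iters):
        # Evaluate the (possibly black-box) forward model on every particle.
        G = np.array([forward_model(u) for u in ensemble])      # (J, k)
        du = ensemble - ensemble.mean(axis=0)                   # (J, d)
        dg = G - G.mean(axis=0)                                 # (J, k)
        # Empirical covariances replace derivatives of the forward model.
        C_ug = du.T @ dg / (J - 1)                              # (d, k)
        C_gg = dg.T @ dg / (J - 1) + noise_cov                  # (k, k)
        K = np.linalg.solve(C_gg, C_ug.T).T                     # Kalman-type gain, (d, k)
        # Stochastic EnKF-style update with perturbed observations.
        y_pert = y + rng.multivariate_normal(np.zeros(len(y)), noise_cov, size=J)
        ensemble = ensemble + (y_pert - G) @ K.T
    return ensemble
```

For a linear forward model with Gaussian noise and prior, iterations of this kind relax the ensemble toward a Gaussian approximation of the posterior, which is the regime addressed by the paper's exponential-convergence result; for strongly non-Gaussian problems such updates are only approximate.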
Similar resources
Bayesian Inference in Large-scale Problems
Bayesian Inference in Large-scale Problems, by James E. Johndrow, Department of Statistical Science, Duke University. Supervisor: David B. Dunson.
Approximate Expectation Propagation for Bayesian Inference on Large-scale Problems
where k indexes experimental replicates, i indexes the probe positions, j indexes the binding positions, and $\mathcal{N}(\,\cdot \mid \sum_j a_{ji} s_j b_j,\ \sigma_i)$ denotes the probability density function of a Gaussian distribution with mean $\sum_j a_{ji} s_j b_j$ and variance $\sigma_i$. We assign prior distributions on the binding event $b_j$ and the binding strength $s_j$: $p(b_j \mid \pi_j) = \pi_j^{b_j}(1-\pi_j)^{1-b_j}$ (3) and $p_0(s_j) = \mathrm{Gamma}(s_j \mid c_0, d_0)$ (4), where Gamma(...
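As a concrete reading of the model in this excerpt, the minimal Python sketch below evaluates the corresponding log joint density. The array names, the layout of the weights a_{ji}, and the rate parameterization of the Gamma prior are assumptions made here for illustration; the excerpt does not fix them.

```python
import numpy as np
from scipy import stats

def log_joint(y, A, b, s, sigma2, pi, c0, d0):
    """Log joint density of the Gaussian-likelihood, Bernoulli/Gamma-prior model above.

    y      : (K, I) replicate-by-probe measurements
    A      : (J, I) influence weights a_{ji} of binding position j on probe i
    b      : (J,)   binary binding events b_j
    s      : (J,)   binding strengths s_j
    sigma2 : (I,)   per-probe noise variances
    pi     : (J,)   prior binding probabilities
    c0, d0 : Gamma prior shape and rate (rate parameterization assumed here)
    """
    mean = (s * b) @ A                                   # probe-wise mean sum_j a_{ji} s_j b_j
    log_lik = stats.norm.logpdf(y, loc=mean, scale=np.sqrt(sigma2)).sum()
    log_prior_b = stats.bernoulli.logpmf(b, pi).sum()
    log_prior_s = stats.gamma.logpdf(s, a=c0, scale=1.0 / d0).sum()
    return log_lik + log_prior_b + log_prior_s
```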
Bayesian Inference Tools for Inverse Problems
In this paper, first the basics of the Bayesian inference for linear inverse problems are presented. The inverse problems we consider are, for example, signal deconvolution, image restoration or image reconstruction in Computed Tomography (CT). The main point to discuss then is the prior modeling of signals and images. We consider two classes of priors: simple or hierarchical with hidden variab...
Bayesian Inference for Inverse Problems Occurring in Uncertainty Analysis
The inverse problem considered here is the estimation of the distribution of a nonobserved random variable X , linked through a time-consuming physical model H to some noisy observed data Y . Bayesian inference is considered to account for prior expert knowledge on X in a small sample size setting. A Metropolis-Hastings-within-Gibbs algorithm is used to compute the posterior distribution of the...
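The abstract names a Metropolis-Hastings-within-Gibbs sampler without further detail. The sketch below is a generic coordinate-wise random-walk version of that scheme; all names (`log_post`, `step`) and the proposal choice are chosen here for illustration rather than taken from the paper.

```python
import numpy as np

def mh_within_gibbs(log_post, x0, n_samples=5000, step=0.1, rng=None):
    """Generic Metropolis-Hastings-within-Gibbs sampler (illustrative sketch).

    log_post : callable returning the log posterior density of a full vector x
    x0       : (d,) starting point
    step     : random-walk proposal scale used for each coordinate
    """
    if rng is None:
        rng = np.random.default_rng()
    x = np.array(x0, dtype=float)
    lp = log_post(x)
    samples = np.empty((n_samples, x.size))
    for t in range(n_samples):
        for j in range(x.size):                        # one MH step per coordinate
            prop = x.copy()
            prop[j] += step * rng.standard_normal()
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
                x, lp = prop, lp_prop
        samples[t] = x
    return samples
```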
Fast Bayesian Inference for Computer Simulation Inverse Problems
Computer models that simulate phenomena regulated by complicated dependencies on unknown parameters are increasingly used in environmental applications. In this paper we present two such applications where the values of the parameters need to be inferred from scarce observations and abundant simulated output. One consists of a climate simulator and the other of a groundwater flow model. Detaile...
Journal
Journal title: Inverse Problems
Year: 2022
ISSN: 0266-5611, 1361-6420
DOI: https://doi.org/10.1088/1361-6420/ac99fa